Schema

Data type registry for runtime validation, reflection and binding.

Install: @travetto/schema

npm install @travetto/schema

# or

yarn add @travetto/schema

This module's purpose is to allow for proper declaration and validation of data types in the course of running a program. The framework defined here is leveraged in the Configuration, Command Line Interface, RESTful API, OpenAPI Specification and Data Modeling Support modules. The schema is the backbone of all data transfer, as it helps provide validation of input correctness, whether the input is a REST request, command line arguments, or a configuration file.

This module provides a mechanism for registering classes and field-level information, as well as the ability to apply that information at runtime.

Registration

The registry's schema information is defined by the Typescript AST and only applies to classes registered with the @Schema decorator.

Classes

The module utilizes AST transformations to collect schema information and facilitate the registration process without user intervention. The class can also be described by providing a:

  • title - definition of the schema
  • description - detailed description of the schema
  • examples - A set of examples as JSON or YAML

The title will be picked up from the JSDoc comments, and additionally all of these values can be set explicitly using the @Describe decorator.

Code: Sample User Schema

import { Schema } from '@travetto/schema';

@Schema()
export class User {
  name: string;
  age: number;
  favoriteFood?: 'pizza' | 'burrito' | 'salad';
  height?: `${number}${'m' | 'ft'}`;
}

From this schema, the registry would have the following information:

Config: User schemas as a YAML file

User:
  fields:
    -
      name: name
      type: string
      required: true
    -
      name: age
      type: number
      required: true
    -
      name: favoriteFood
      type: string
      required: false
      allowedValues: ["pizza", "burrito", "salad" ]
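
As noted above, the class-level title and description are picked up from JSDoc comments, or can be set explicitly with the @Describe decorator. A minimal sketch is shown below; the exact shape of the @Describe argument (a partial of title/description/examples) is an assumption, not confirmed API.

Code: Described Schema (sketch)

import { Schema, Describe } from '@travetto/schema';

/**
 * A system user
 *
 * Represents a person with access to the system
 */
@Schema()
export class DescribedUser {
  // The JSDoc block above supplies the class title and description

  /** The user's full name */
  name: string;

  // Assumption: @Describe accepts a partial of title/description/examples
  @Describe({ description: 'Age, in years' })
  age: number;
}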

Fields

This schema provides a powerful base for data binding and validation at runtime. Additionally, there may be types that cannot be detected, or information that the programmer would like to override. Below are the supported field decorators:

  • @Field defines a field that will be serialized.
  • @Required defines that a field should be required
  • @Enum defines the allowable values that a field can have
  • @Match defines a regular expression that the field value should match
  • @MinLength enforces min length of a string
  • @MaxLength enforces max length of a string
  • @Min enforces min value for a date or a number
  • @Max enforces max value for a date or a number
  • @Email ensures string field matches basic email regex
  • @Telephone ensures string field matches basic telephone regex
  • @Url ensures string field matches basic url regex
  • @Ignore excludes a field from automatic schema registration
  • @Integer ensures a number passed in is a whole number
  • @Float indicates that a number passed in may have fractional values
  • @Currency provides support for standard currency
  • @Text indicates that a field is expecting natural language input, not just discrete values
  • @LongText same as text, but expects longer form content
  • @Readonly defines that a field should not be bindable from outside the class
  • @Writeonly defines that a field should not be exported during serialization, but that it can still be bound to
  • @Secret marks a field as being sensitive. This is used by certain logging activities to ensure sensitive information is not logged out.
  • @Specifier attributes additional specifiers to a field, allowing for more specification beyond just the field's type.
  • @SubTypeField allows for promoting a given field as the owner of the sub type discriminator (defaults to type).

Additionally, schemas can be nested to form more complex data structures that are able to be bound and validated.

Just like the class, all fields can be defined with:

  • description - detailed description of the field
  • examples - A set of examples as JSON or YAML

Similarly, the description will be picked up from the JSDoc comments, and all of these values can also be set explicitly using the @Describe decorator.
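
For illustration, here is a hedged sketch combining several of the field decorators listed above; the field names and constraint values are arbitrary, and the argument forms for @MinLength, @Max and @Email are assumed from their descriptions rather than confirmed.

Code: Field Decorators (sketch)

import { Schema, MinLength, Min, Max, Email, Match } from '@travetto/schema';

@Schema()
export class Registration {
  // Assumed: @MinLength takes the minimum allowed string length
  @MinLength(2)
  name: string;

  // @Min/@Max bound a numeric (or date) value
  @Min(13) @Max(120)
  age: number;

  // Basic email format check
  @Email()
  contact: string;

  // Must match the supplied regular expression
  @Match(/^[A-Z]{2}-\d{4}$/)
  referralCode?: string;
}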

Parameters

Parameters are available in certain scenarios (e.g. RESTful API endpoints and Command Line Interface main methods). In these scenarios, all of the field decorators are valid, but need to be invoked slightly differently to pass the typechecker. The simple solution is to use the Arg field of the decorator to convince Typescript it's the correct type.

Code: Sample Parameter Usage

import { Match, Min } from '@travetto/schema';

const NAME_REGEX = /[A-Z][a-z]+(\s+[A-Z][a-z]+)*/;

export class ParamUsage {
  main(@Match(NAME_REGEX) name: string, @Min(20) age?: number) {
    console.log('Valid name and age!', { name, age });
  }
}

Binding/Validation

At runtime, once a schema is registered, a programmer can utilize this structure to perform specific operations, namely binding and validation.

Binding

Binding is a very simple operation: it takes in a class registered as a @Schema and a JS object that will be the source of the binding. Given the schema:

Code: Sub Schemas via Address

import { Schema, Integer } from '@travetto/schema';

@Schema()
export class Address {
  street1: string;
  street2: string;
}

@Schema()
export class Person {
  name: string;
  @Integer() age: number;
  address: Address;
}

A binding operation could look like:

Code: Binding from JSON to Schema

import { Person } from './person';

export function Test(): Person {
  return Person.from({
    name: 'Test',
    age: 19.999978,
    address: {
      street1: '1234 Fun',
      street2: 'Unit 20'
    }
  });
}

and the output would be a Person instance with the following structure:

Terminal: Sample data output after binding

$ trv main doc/person-output.ts

Person {
  name: 'Test',
  age: 19,
  address: Address { street1: '1234 Fun', street2: 'Unit 20' }
}

Note: Binding will attempt to convert/coerce types as much as possible to honor the pattern of Javascript and its dynamic nature.
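
As a small, hypothetical illustration of that coercion, building on the Person schema above (the cast exists only to quiet the typechecker for the example):

Code: Coercion during binding (sketch)

import { Person } from './person';

// A numeric string with a fractional value would be coerced to fit the
// declared @Integer field, yielding age === 19 as in the output above.
export function coerced(): Person {
  return Person.from({
    name: 'Test',
    age: '19.99' as unknown as number,
    address: { street1: '1 Main', street2: 'Apt 2' }
  });
}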

Validation

Validation is very similar to binding, but instead of attempting to assign values, it checks them: any mismatch or violation of the schema will result in errors. All errors will be collected and returned. Given the same schema as above:

Code: Sub Schemas via Address

import { Schema, Integer } from '@travetto/schema';

@Schema()
export class Address {
  street1: string;
  street2: string;
}

@Schema()
export class Person {
  name: string;
  @Integer() age: number;
  address: Address;
}

But now with an invalid JSON object:

Code: Read Person, and validate

import { SchemaValidator } from '@travetto/schema';

import { Person } from './person';

export async function validate(): Promise<void> {

  const person = Person.from({
    name: 'Test',
    age: 'abc',
    address: {
      street1: '1234 Fun'
    }
  });

  await SchemaValidator.validate(Person, person);
}

would produce an exception similar to the following structure:

Terminal: Sample error output

$ trv main doc/person-invalid-output.ts

Validation Failed {
  "message": "Validation errors have occurred",
  "category": "data",
  "type": "ValidationResultError",
  "at": "2029-03-14T04:00:00.618Z",
  "details": {
    "errors": [
      {
        "kind": "type",
        "type": "number",
        "message": "age is not a valid number",
        "path": "age"
      },
      {
        "kind": "required",
        "active": true,
        "message": "address.street2 is required",
        "path": "address.street2"
      }
    ]
  }
}

Custom Validators

Within the schema framework, it is possible to add custom validators at the class level. This allows for more flexibility when dealing with specific situations (e.g. password requirements or ensuring two fields match).

Code: Password Validator

import { Schema, Validator, ValidationError } from '@travetto/schema';

const passwordValidator = (user: User): ValidationError | undefined => {
  const p = user.password;
  const hasNum = /\d/.test(p);
  const hasSpecial = /[!@#$%%^&*()<>?/,.;':"']/.test(p);
  const noRepeat = !/(.)(\1)/.test(p);
  if (!hasNum || !hasSpecial || !noRepeat) {
    return {
      kind: 'password-rules',
      path: 'password',
      message: 'A password must include at least one number, one special char, and have no repeating characters'
    };
  }
};

@Schema()
@Validator(passwordValidator)
class User {
  password: string;
}

When the validator is executed, it has access to the entire object, and can check any of its values. To indicate that an error has occurred, the validator is expected to return an object with the following structure:

Code: Validation Error Structure

export interface ValidationError {
  /**
   * The error message
   */
  message: string;
  /**
   * The object path of the error
   */
  path: string;
  /**
   * The kind of validation
   */
  kind: ValidationKind;
  /**
   * The value provided
   */
  value?: unknown;
  /**
   * Regular expression to match
   */
  re?: string;
  /**
   * Number to compare against
   */
  n?: number | Date;
  /**
   * The type of the field
   */
  type?: string;
}
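
A brief sketch of exercising the password validator above, assuming the User class is exported from a local './user' file; it mirrors the validation example earlier in this document:

Code: Running the custom validator (sketch)

import { SchemaValidator } from '@travetto/schema';

import { User } from './user';

export async function checkPassword(): Promise<void> {
  // 'aaaa' has no number, no special character, and repeats characters,
  // so passwordValidator should report the 'password-rules' error
  const user = User.from({ password: 'aaaa' });
  await SchemaValidator.validate(User, user);
}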

Custom Types

When working with the schema, the basic types are easily understood, but some of Typescript's more advanced constructs are too complex to automate cleanly.

To that end, the module supports two concepts:

Type Adapters

This feature is meant to allow simple Typescript types to be backed by a proper class. This is necessary because all of the Typescript type information disappears at runtime, and so only concrete types (like classes) remain. An example of this can be found in how the Data Model Querying module handles geo data.

Code: Simple Custom Type

import { DataUtil } from '@travetto/schema';

/**
 * @concrete #PointImpl
 */
export type Point = [number, number];

const INVALID = Symbol.for('invalid-point');

export class PointImpl {
  static validateSchema(input: unknown): 'type' | undefined {
    const ret = this.bindSchema(input);
    return ret !== INVALID && ret && !isNaN(ret[0]) && !isNaN(ret[1]) ? undefined : 'type';
  }

  static bindSchema(input: unknown): [number, number] | typeof INVALID | undefined {
    if (Array.isArray(input) && input.length === 2) {
      const [a, b] = input.map(x => DataUtil.coerceType(x, Number, false));
      return [a, b];
    } else {
      return INVALID;
    }
  }
}

What you can see here is that the Point type is now backed by a class that supports:

  • validateSchema - Will run during validation for this specific type.
  • bindSchema - Will run during binding to ensure correct behavior.

Code: Simple Custom Type Usage

import { Schema } from '@travetto/schema';
import { Point } from './custom-type';

@Schema()
export class LocationAware {
  name: string;
  point: Point;
}

All that happens now is that the type is exported, and the class above is able to properly handle Point as an [x, y] tuple. All standard binding and validation patterns are supported, and type enforcement will work as expected.

Terminal: Custom Type Validation

$ trv main doc/custom-type-output.ts

Validation Failed {
  "message": "Validation errors have occurred",
  "category": "data",
  "type": "ValidationResultError",
  "at": "2029-03-14T04:00:00.837Z",
  "details": {
    "errors": [
      {
        "kind": "type",
        "type": "PointImpl",
        "message": "point is not a valid PointImpl",
        "path": "point"
      }
    ]
  }
}
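
A hypothetical sketch of a script that could trigger a failure like the one above, assuming LocationAware is exported from a local './location-aware' file (the cast exists only to quiet the typechecker):

Code: Triggering custom type validation (sketch)

import { SchemaValidator } from '@travetto/schema';

import { LocationAware } from './location-aware';

export async function main(): Promise<void> {
  // 'home' is not a two-element numeric array, so PointImpl marks it invalid
  // and validation reports a 'type' error for the 'point' field
  const loc = LocationAware.from({ name: 'Home', point: 'home' as unknown as [number, number] });
  await SchemaValidator.validate(LocationAware, loc);
}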

Data Utilities

Data utilities for binding values and type conversion. Currently DataUtil includes:

  • deepAssign(a, b, mode?) allows for deep assignment of b onto a; the mode determines how aggressive and how flexible the assignment is. mode can have any of the following values:

    • loose, the default, is the most lenient. It will not error out, and overwrites will always happen
    • coerce will attempt to force values from b to fit the types of a, and will error out if it can't
    • strict will error out if the types do not match
  • coerceType(input: unknown, type: Class<unknown>, strict = true) allows for converting an input value into an instance of the specified type, or throws an error if the types are incompatible.

  • shallowClone<T = unknown>(a: T): T will create a shallow clone of a value

  • filterByKeys<T>(obj: T, exclude: (string | RegExp)[]): T will filter a given object, returning a plain object (if applicable) with the fields matching the exclude values removed
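
A brief, hedged sketch of these utilities in use; the values are arbitrary and the commented results are expectations based on the descriptions above, not verified output:

Code: DataUtil usage (sketch)

import { DataUtil } from '@travetto/schema';

// deepAssign: merge b onto a; 'loose' (the default) overwrites freely
const merged = DataUtil.deepAssign({ a: 1, nested: { x: 1 } }, { nested: { y: 2 } }, 'loose');

// coerceType: convert an input value into an instance of the given type
const age = DataUtil.coerceType('20', Number, true); // expected: 20

// shallowClone: copy only the top level of a value
const copy = DataUtil.shallowClone(merged);

// filterByKeys: return a copy with fields matching the exclusions removed
const safe = DataUtil.filterByKeys({ name: 'Test', password: 'secret' }, [/pass/]);

console.log(merged, age, copy, safe);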
